Use LLMs with watsonx.ai Locally

The ibm_watsonx_ai Python library lets you work with IBM watsonx Machine Learning services, including LLMs. You can train, store, and deploy your models, and evaluate them through APIs. It also offers API access to pre-trained, state-of-the-art LLMs, so you can integrate them seamlessly into your application development process. See the library's introductory documentation for details.


The package can be installed with pip:

```shell
pip install ibm-watsonx-ai
# you can also specify the package version, such as ibm-watsonx-ai==1.1.20
```

For example, the following is a code snippet for creating a simple QA chat with a Llama 3.2 model using watsonx.ai's machine learning library.

```python
from ibm_watsonx_ai.foundation_models import ModelInference
from ibm_watsonx_ai.foundation_models.schema import TextChatParameters

# Set up the API key and project ID for IBM watsonx
watsonx_API = ""  # instructions for getting this are below
project_id = ""   # like "0blahblah-000-9999-blah-99bla0hblah0"

# Generation parameters
params = TextChatParameters(
    temperature=0.7,
    max_tokens=1024
)

model = ModelInference(
    model_id='meta-llama/llama-3-2-11b-vision-instruct',
    params=params,
    credentials={
        "apikey": watsonx_API,
        "url": "https://us-south.ml.cloud.ibm.com"
    },
    project_id=project_id
)

q = "How to be happy?"
messages = [
    {
        "role": "user",
        "content": [
            {
                "type": "text",
                "text": q
            }
        ]
    }
]

generated_response = model.chat(messages=messages)
print(generated_response['choices'][0]['message']['content'])
```
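Since model.chat() returns a plain dictionary, you can pull the reply text out with a small helper. The sketch below is illustrative: extract_reply and the mocked response are not part of the library; only the dictionary shape mirrors the snippet above.

```python
def extract_reply(response: dict) -> str:
    """Return the assistant's text from a chat response dict
    shaped like the one model.chat() returns."""
    return response["choices"][0]["message"]["content"]

# Mocked response with the same structure as the real API output,
# so the helper can be tried without credentials:
mock_response = {
    "choices": [
        {"message": {"role": "assistant", "content": "Practice gratitude daily."}}
    ]
}
print(extract_reply(mock_response))  # → Practice gratitude daily.
```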

As you might have noticed, you need watsonx_API and project_id to authenticate when using the watsonx API. The following instructions will guide you through getting them.

Get your Own API key and project ID

watsonx API

You can get the watsonx_API key by first signing up for an IBM watsonx account on IBM Cloud.

After you sign up for and sign in to your IBM watsonx account, you can follow this demonstration to create/get your IBM Cloud user API key at https://cloud.ibm.com/iam/apikeys.

[Image: creating an IBM Cloud user API key]

Project ID

Next, you need to create/get a project ID for your watsonx.ai project. Go to watsonx.ai and create a project:

[Image: creating a project on watsonx.ai]

In the management console, find the project_id under:
Management –> General

(⚠️ Wait! ⚠️ It is NOT done yet!) You also need to add a service to your project to make it work.

To add a service to your project (as the image below shows), after you create a project:

⇨ Go to the project's Manage tab.
⇨ Select the Services & integrations page.
⇨ Click Associate Services.
⇨ Add the Watson Machine Learning service to it.

[Image: associating the Watson Machine Learning service with a project]

🎉 Done! 🎉

Now, you can input the watsonx_API and project_id into the code snippet and try it. The sample output should be similar to this:
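Rather than pasting the key and project ID directly into the script, you might prefer to load them from environment variables. This is an optional sketch; the variable names WATSONX_API_KEY and WATSONX_PROJECT_ID are my own convention, not something the library requires:

```python
import os

def load_watsonx_credentials():
    """Read the watsonx API key and project ID from environment
    variables. The names used here (WATSONX_API_KEY and
    WATSONX_PROJECT_ID) are an assumption of this sketch, not
    names mandated by ibm_watsonx_ai."""
    return (
        os.environ.get("WATSONX_API_KEY", ""),
        os.environ.get("WATSONX_PROJECT_ID", ""),
    )
```

With this approach you would export both variables in your shell before running the script, keeping secrets out of your source code.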

[Image: sample model output]

© IBM Corporation. All rights reserved.

Author(s)

Hailey Quach is a Data Scientist at IBM.

Ricky Shi is a Data Scientist at IBM.

Changelog

| Date | Version | Changed by | Change Description |
|------|---------|------------|--------------------|
| 2025/02/05 | 1.1 | Hailey Quach | Updated lab |